
localeIdentifier   Description
eu                 Basque
hr_BA              Croatian (Bosnia & Herzegovina)
en_CM              English (Cameroon)
en_BI              English (Burundi)
rw_RW              Kinyarwanda (Rwanda)
ast                Asturian
en_SZ              English (Swaziland)
he_IL              Hebrew (Israel)
ar                 Arabic
{
"WAT": "Africa/Lagos",
"EAT": "Africa/Addis_Ababa",
"MDT": "America/Denver",
"ICT": "Asia/Bangkok",
"TRT": "Europe/Istanbul",
"PHT": "Asia/Manila",
"CAT": "Africa/Harare",
"EET": "Europe/Athens",
"SGT": "Asia/Singapore",
[
"Africa/Abidjan",
"Africa/Accra",
"Africa/Addis_Ababa",
"Africa/Algiers",
"Africa/Asmara",
"Africa/Bamako",
"Africa/Bangui",
"Africa/Banjul",
"Africa/Bissau",

Given a subscribed calendar with a URL like

https://example.com/example.ics

To force Google Calendar to refresh and reload the contents right now, unsubscribe from the calendar and subscribe to a new calendar with a URL like

https://example.com/example.ics#1

Adding the URL fragment makes Google Calendar treat it as a new calendar, so the feed contents are fetched again immediately.

CMCDragonkai / memory_layout.md
Last active April 28, 2024 18:50
Linux: Understanding the Memory Layout of Linux Executables

Understanding the Memory Layout of Linux Executables

Required tools for playing around with memory:

  • hexdump
  • objdump
  • readelf
  • xxd
  • gcore
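
As a companion to the tool list, here is a minimal C sketch (my addition, not part of the gist) that prints one address from each of the main regions of a running process, so the output can be compared with what readelf, objdump, or /proc/<pid>/maps report; the file name and build command are assumptions.

/* memory_layout_demo.c -- illustrative sketch only.
 * Prints one address from each of the text, data, bss, heap, and stack
 * regions of the running process.
 * Build: gcc -no-pie memory_layout_demo.c -o memory_layout_demo
 * Compare the output with readelf -S and cat /proc/<pid>/maps. */
#include <stdio.h>
#include <stdlib.h>

int initialized_global = 42;   /* lands in .data */
int uninitialized_global;      /* lands in .bss  */

int main(void) {
    int local = 0;                             /* lives on the stack */
    int *heap_obj = malloc(sizeof *heap_obj);  /* lives on the heap  */

    printf("text  (main)                 : %p\n", (void *)main);
    printf(".data (initialized_global)   : %p\n", (void *)&initialized_global);
    printf(".bss  (uninitialized_global) : %p\n", (void *)&uninitialized_global);
    printf("heap  (heap_obj)             : %p\n", (void *)heap_obj);
    printf("stack (local)                : %p\n", (void *)&local);

    free(heap_obj);
    return 0;
}
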
arkatsy / zustand-internals.jsx
Last active April 28, 2024 18:44
How zustand works internally
import { useSyncExternalStore } from "react";
// For more on the useSyncExternalStore hook, see https://react.dev/reference/react/useSyncExternalStore
// The code is almost identical to zustand's source code, with the types removed and some features stripped out.
// Check the links to see the references in the source code.
// The links reference v5 of the library. If you plan on reading the source code yourself, v5 is the best place to start.
// The current v4 version contains a lot of deprecated code and extra features that make it hard to reason about if you're new to this.
// https://github.com/pmndrs/zustand/blob/fe47d3e6c6671dbfb9856fda52cb5a3a855d97a6/src/vanilla.ts#L57-L94
function createStore(createState) {
ttwizz / RagdollService.lua
Created April 28, 2024 18:40
Ragdoll Service for Roblox Studio
--!nocheck
local RagdollService = {}
export type ConstraintsInfo = {
	[string]: {
		[string]: boolean | string | number
	}
jumbojets / autograd.ml
Last active April 28, 2024 18:42
proof of concept autograd implementation in ocaml
(* TODO: multicore and/or gpu computation would be fun *)
(* TODO: mnist sample *)
module Matrix = struct
  module FA = Float.Array
  type t = { data : FA.t; cols : int }
  let make m n value =
    let data = FA.make (m * n) value in
rain-1 / llama-home.md
Last active April 28, 2024 18:42
How to run Llama 13B with a 6GB graphics card

This worked on 14/May/23. The instructions will probably require updating in the future.

LLaMA is a text prediction model similar to GPT-2 and to the version of GPT-3 that has not been fine-tuned yet. It is also possible, I think, to run fine-tuned versions (like Alpaca or Vicuna) with this; those versions are more focused on answering questions.

Note: I have been told that this does not support multiple GPUs. It can only use a single GPU.

It is now possible to run LLaMA 13B on a 6GB graphics card (e.g. an RTX 2060), thanks to the amazing work on llama.cpp. The latest change adds CUDA/cuBLAS support, which lets you pick an arbitrary number of transformer layers to run on the GPU. This is perfect for low VRAM.

  • Clone llama.cpp from git; I am on commit 08737ef720f0510c7ec2aa84d7f70c691073c35d.